Chebyshev Approximation via Polynomial Mappings and the Convergence Behaviour of Krylov Subspace Methods

Authors

  • BERND FISCHER
  • FRANZ PEHERSTORFER
Abstract

Let φ_m be a polynomial satisfying some mild conditions. Given a set R ⊂ C, a continuous function f on R and its best approximation p*_{n−1} from Π_{n−1} with respect to the maximum norm, we show that p*_{n−1} ∘ φ_m is a best approximation to f ∘ φ_m on the inverse polynomial image S of R, i.e. φ_m(S) = R, where the extremal signature is given explicitly. A similar result is presented for constrained Chebyshev polynomial approximation. Finally, we apply the obtained results to the computation of the convergence rate of Krylov subspace methods when applied to a preconditioned linear system. We investigate pairs of preconditioners where the eigenvalues are contained in sets S and R, respectively, which are related by φ_m(S) = R.
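To make the mapping result concrete, the following display restates the abstract in formulas (a paraphrase, not an equation quoted from the paper); the notation φ_m, p*_{n−1}, Π_{n−1}, R and S is as above. Since φ_m(S) = R, composing with φ_m leaves the maximum error unchanged:

    % Restatement of the mapping identity behind the abstract; notation as in the text.
    \[
      \max_{w \in S} \bigl| (f \circ \varphi_m)(w) - (p^{*}_{n-1} \circ \varphi_m)(w) \bigr|
      \;=\; \max_{z \in R} \bigl| f(z) - p^{*}_{n-1}(z) \bigr|
      \;=\; \min_{p \in \Pi_{n-1}} \max_{z \in R} \bigl| f(z) - p(z) \bigr|,
      \qquad \varphi_m(S) = R.
    \]

The first equality is immediate from φ_m(S) = R; the substance of the paper is that p*_{n−1} ∘ φ_m is in fact a best approximation to f ∘ φ_m on S, certified by an explicitly given extremal signature.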


Related articles

Convergence of Restarted Krylov Subspaces to Invariant Subspaces

The performance of Krylov subspace eigenvalue algorithms for large matrices can be measured by the angle between a desired invariant subspace and the Krylov subspace. We develop general bounds for this convergence that include the effects of polynomial restarting and impose no restrictions concerning the diagonalizability of the matrix or its degree of nonnormality. Associated with a desired se...


Chebyshev Semi-iteration in Preconditioning for Problems including the Mass Matrix

It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace, that is, it will implicitly compute the optimal polynomial. Henc...
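For comparison, the following is a minimal textbook-style sketch of the Chebyshev (semi-)iteration for a symmetric positive definite system, written in Python; it is not code from the cited paper, and it assumes that bounds lmin and lmax on the spectrum are available, which is precisely the a-priori information the Conjugate Gradient method does not require.

    import numpy as np

    def chebyshev_iteration(A, b, lmin, lmax, x0=None, maxit=50):
        """Chebyshev (semi-)iteration for A x = b with spectrum(A) in [lmin, lmax].

        Textbook sketch: no inner products are used, only the eigenvalue bounds.
        """
        x = np.zeros_like(b) if x0 is None else x0.copy()
        theta = 0.5 * (lmax + lmin)    # centre of the spectral interval
        delta = 0.5 * (lmax - lmin)    # half-width of the spectral interval
        sigma = theta / delta
        rho = 1.0 / sigma
        r = b - A @ x
        d = r / theta
        for _ in range(maxit):
            x = x + d
            r = r - A @ d
            rho_old = rho
            rho = 1.0 / (2.0 * sigma - rho_old)
            d = rho * rho_old * d + (2.0 * rho / delta) * r
        return x

After k steps the error corresponds to a scaled-and-shifted Chebyshev polynomial on [lmin, lmax], which is optimal for an interval; CG instead adapts its polynomial implicitly to the actual spectrum of the matrix.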


Chebyshev semi-iteration in Preconditioning

It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace, that is, it will implicitly compute the optimal polynomial. ...


Chebyshev acceleration techniques for large complex non hermitian eigenvalue problems

The computation of a few eigenvalues and the corresponding eigenvectors of large complex non-Hermitian matrices arises in many applications in science and engineering, such as magnetohydrodynamics or electromagnetism [6], where the eigenvalues of interest often belong to some region of the complex plane. If the size of the matrices is relatively small, then the problem can be solved by the standa...


The Chebyshev Polynomials of a Matrix

A Chebyshev polynomial of a square matrix A is a monic polynomial p of specified degree that minimizes ‖p(A)‖2. The study of such polynomials is motivated by the analysis of Krylov subspace iterations in numerical linear algebra. An algorithm is presented for computing these polynomials based on reduction to a semidefinite program, which is then solved by a primal-dual interior point method. Exam...
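Because the objective is convex in the non-leading coefficients, a small illustration can be written with an off-the-shelf conic modelling tool. The Python sketch below is only an illustration under assumptions, not the algorithm of the cited paper: it assumes a real square matrix A, uses the hypothetical helper name matrix_chebyshev, and leaves the semidefinite reformulation to cvxpy's sigma_max atom and its default solver.

    import numpy as np
    import cvxpy as cp

    def matrix_chebyshev(A, n):
        """Coefficients of a monic degree-n polynomial p minimizing ||p(A)||_2.

        Illustrative sketch: the spectral-norm objective is expressed with
        cvxpy's sigma_max atom, which the solver handles as a semidefinite cone.
        """
        powers = [np.linalg.matrix_power(A, k) for k in range(n)]  # A^0, ..., A^(n-1)
        c = cp.Variable(n)                                         # non-leading coefficients
        P = sum(c[k] * powers[k] for k in range(n)) + np.linalg.matrix_power(A, n)
        problem = cp.Problem(cp.Minimize(cp.sigma_max(P)))
        problem.solve()
        # coefficients ordered from the leading (monic) term down to the constant
        return np.concatenate(([1.0], c.value[::-1])), problem.value

The returned optimal value is the minimal spectral norm ‖p(A)‖2 over monic polynomials of degree n; for a normal matrix this reduces to a scalar Chebyshev problem on the spectrum.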




Publication date: 2001